1 | One model for the learning of language
In: Proceedings of the National Academy of Sciences of the United States of America, vol. 119, iss. 5 (2022)
BASE

2 | One model for the learning of language
In: Proc Natl Acad Sci U S A (2022)
BASE

3 | Developing an Automatic System for Classifying Chatter About Health Services on Twitter: Case Study for Medicaid
In: J Med Internet Res (2021)
BASE

4 | Self-reported COVID-19 symptoms on Twitter: an analysis and a research resource
In: J Am Med Inform Assoc (2020)
BASE

5 | A Light-Weight Text Summarization System for Fast Access to Medical Evidence
In: Front Digit Health (2020)
BASE

6 | Developing Intercultural Action Competence: A Didactic Concept for Business German Instruction in China, Using Learning Videos as an Example
Yang, Yuan [author]. - München : Iudicium, 2019
DNB Subject Category Language

7 | Developing Intercultural Action Competence. A Didactic Concept for Business German Instruction in China, Using Learning Videos as an Example
DNB Subject Category Language

8 | Punctuation and Parallel Corpus Based Word Embedding Model for Low-Resource Languages
In: Information ; Volume 11 ; Issue 1 (2019)
BASE

9 | One Model for the Learning of Language ...

Abstract:
A major target of linguistics and cognitive science has been to understand what class of learning systems can acquire the key structures of natural language. Until recently, the computational requirements of language have been used to argue that learning is impossible without a highly constrained hypothesis space. Here, we describe a learning system that is maximally unconstrained, operating over the space of all computations, and that is able to acquire several of the key structures present in natural language from positive evidence alone. The model successfully acquires regular (e.g., $(ab)^n$), context-free (e.g., $a^n b^n$, $x x^R$), and context-sensitive (e.g., $a^n b^n c^n$, $a^n b^m c^n d^m$, $xx$) formal languages. Our approach develops the concept of factorized programs in Bayesian program induction in order to help manage the complexity of representation. We show that, in learning, the model predicts several phenomena empirically observed in human grammar-acquisition experiments.

Comment: This is a draft write-up of an undergraduate project. A full journal version is still under preparation.

Keyword:
Artificial Intelligence cs.AI; FOS Computer and information sciences

URL: https://dx.doi.org/10.48550/arxiv.1711.06301 https://arxiv.org/abs/1711.06301

BASE
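The formal-language classes named in the abstract of entry 9 can be illustrated with simple membership checkers. This is a minimal, illustrative sketch only — it is not the paper's learning model, and the function names are my own; it just shows what strings belong to the example languages $(ab)^n$, $a^n b^n$, $x x^R$, and $a^n b^n c^n$.

```python
import re

def is_ab_n(s: str) -> bool:
    # Regular language (ab)^n: zero or more repetitions of "ab".
    return re.fullmatch(r"(ab)*", s) is not None

def is_a_n_b_n(s: str) -> bool:
    # Context-free language a^n b^n: n a's followed by n b's.
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * n + "b" * n

def is_x_xr(s: str) -> bool:
    # Context-free language x x^R: a string followed by its reverse
    # (i.e., an even-length palindrome).
    half = len(s) // 2
    return len(s) % 2 == 0 and s[:half] == s[half:][::-1]

def is_a_n_b_n_c_n(s: str) -> bool:
    # Context-sensitive language a^n b^n c^n: equal counts of a, b, c in order.
    n = len(s) // 3
    return len(s) % 3 == 0 and s == "a" * n + "b" * n + "c" * n
```

The point of the examples in the abstract is that these languages require increasingly powerful grammars in the Chomsky hierarchy, yet a single unconstrained program-induction learner can acquire all of them from positive evidence.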